The use of mobile technology for work-based assessment: the student experience
Abstract
This paper outlines a research project conducted at the Leeds University School of Medicine with the Assessment & Learning in Practice Settings Centre for Excellence in Teaching and Learning, a collaboration between the Universities of Leeds, Huddersfield, Bradford, Leeds Metropolitan University and York St John University. The research is a proof of concept examining the impact of delivering competency-based assessment via personal digital assistants (PDAs) amongst a group of final year undergraduate medical students. This evaluation reports a positive student experience of mobile technology for assessment, concluding that overall the students found completing assessments using a PDA straightforward and that the structured format of the assessment resulted in an increased amount and improved quality of feedback, allowing students to improve their skills during the placement. A relationship between using the PDA for learning and setting goals for achievement was clearly demonstrated.

Introduction

Using mobile technology for teaching and learning is a rapidly evolving area of educational research (Collins, 1996; Frohberg, 2002; Preece, 2000; Vavoula, Pachler & Kukulska-Hulme, 2009). The current research focus is in general evaluative, with a variety of practitioners exploring the delivery, methodology and feasibility of mobile device usage in a wide range of education contexts, including the building of information technology (IT) infrastructure, technical support and other resources required (Ally, 2009; Kukulska-Hulme & Traxler, 2005). Medical education research reflects this wider field, with a significant amount of mobile learning research focusing on feasibility combined with data on user experience (Dearnley, Haigh & Fairhall, 2007; Fisher & Baird, 2006; Garrett & Jackson, 2006; Shim & Viswanathan, 2007; Triantafillou, Georgiadou & Economides, 2008). These studies outline the type of infrastructure used to support mobile learning (m-learning), describe the issues encountered when testing systems and report positive user experiences of the use of mobile technology for learning and workload management. A small amount of research in health care education has focused on the benefits of using technology for assessment, suggesting heightened understanding of the assessment method by students and a link with improvement in the quality of students' work (McGuire, 2005). In contrast, Kneebone and Brenton (2005) found the use of personal digital assistants (PDAs) for assessed written reflection too slow and time consuming, reducing the quality of work. Several studies have identified the need to work with industry to create mobile devices specifically suited to learning (Kramer, 2009; Naismith, Lonsdale, Vavoula & Sharples, 2004), and new products on the market such as the iPhone are indicative of the progress made in this area so far. However, appropriate use of m-learning is key to the emerging attitudes of staff and students to more general application.
Whilst cultural change is necessary for m-learning to become mainstream within Higher Education Institutions overall, within the health and social care environment a major paradigm shift may be needed to accept this type of technology as an everyday part of learning amongst staff, students and patients. This is due to the somewhat unique curriculum design of health and social care courses, where much of the teaching is delivered in work-based settings by educators not employed by the university, and where students are constantly observed by the public whilst learning. Previous National Health Service (NHS) policies banning mobile phones in hospitals and other clinical areas tend to result in a 'taboo' attitude towards the use of mobile devices amongst health care practitioners and patients. This attitude is heightened by perceptions amongst staff and patients that students using mobile devices in clinical settings are doing so for personal use, such as texting or messaging their friends, rather than for assessment and accessing reference material; mobile devices are not associated with learning (Koskimaa et al, 2007).

Why mobile learning?

Despite the challenges (and barriers) to m-learning in clinical contexts, we reasoned that there are still clear benefits to warrant exploration in this study. When students are in the workplace it is often difficult to know what they are learning and whether they are improving. Work-based placements are typically assessed summatively at their conclusion, with little formal formative assessment taking place before this. As a result, students may not maximise their learning opportunities, or focus on improving areas of weakness when they have the opportunity to do so. Mobile assessments allow tutors to review students' progress remotely and allow students to receive more feedback on a more frequent basis.

Much has been written about the 'net generation' (Tapscott, 1998) and 'digital natives' (Prensky, 2001b) as descriptions of young people who have grown up surrounded by technology, and about the need to adapt teaching and learning to their experience and abilities. Whilst engaging with new technologies to ensure a modern and interesting learning experience is important for competitive growth in the university sector, whether students actually achieve better qualifications as a result of these technologies is debatable, and whether students are in fact confident in the use of technology for education remains questionable (Ramanau, Sharpe & Benfield, 2008). The findings of this paper, though too limited to be considered representative of all students, seem to present a challenge to the concept of students as digital natives and potentially support the findings of Ramanau, Sharpe and Benfield. This may be due to the attributes of a medical student population (increasingly a diverse student group, from a wide range of backgrounds due to the widening participation agenda), or simply because not all young people are in fact digital natives.

Context

In this project the students were asked to complete competency assessments (Mini-Clinical Evaluation Exercise [mini-CEX]) whilst on a work-based placement. The mini-CEX is a validated generic tool used to assess core competencies within the health care professions, such as communication, physical examination, reasoning and practical skills.
These assessments are most often used in Medicine in postgraduate professional portfolios (Holmboe, 2001). The participant identifies an opportunity to conduct an assessment with a clinical supervisor. Consent is sought from any patient examined as part of the assessment, and the student then conducts the examination of the patient whilst a supervisor/assessor evaluates their performance. The evaluation is conducted using a form comprising rater scales (1–9) to assess competence in various skills, such as organisation, professionalism and clinical reasoning, followed by an overall global competence rating. A free text box at the end of the form allows comments and action planning (see Figure A1 in the Appendix for an example of the form used in this study). The mini-CEX is well established in postgraduate Medicine, often delivered via the Internet or a work-based PC (Durning, 2002; Holmboe, 2001; Norcini, Blank, Duffy & Fortna, 2003), and more recently the practice of using the mini-CEX with undergraduate students has been developed (Hauer, 2000; Kogan, Bellin & Shea, 2002). Work-based, formative assessment (mini-CEX) allows the student to identify areas of weakness and improve these by undergoing assessment and receiving feedback on their performance. This type of self-directed, personalised learning is attractive, as work-based assessment of competence is of key practical use to medical students.

Based on our assumptions of the need to generate a method of assessment that would capture the needs of our 'net generation' and provide grounded feedback, we hypothesised that the use of the mini-CEX via PDAs might achieve these goals. To assist the exploration of this hypothesis, we reviewed the small amount of research within the health disciplines into the use of PDAs or other forms of mobile technology to deliver assessment (Axelson, Wardh, Strender & Nilsson, 2007; Finlay, Norman, Stolberg, Weaver & Keane, 2006; Kneebone et al, 2008). Work in Wisconsin with third-year medical students tested feasibility and user satisfaction with a PDA-based mini-CEX form (Torre, Simpson, Elnicki, Sebastian & Holmboe, 2007). Whilst the study found that students, faculty and residents were highly satisfied with using the device to undertake assessment, the very small number of assessments undertaken precluded description of any incremental improvement in learning through using the mini-CEX.

The aim of this project was to pilot the use of a mini-CEX delivered via PDAs to support learning within an extended period of recapitulatory study for 13 final year students who had failed their final qualifying examinations. This period of recapitulatory study would take place in a work-based placement. On completion of the placement, students would retake their practical final exams. Previous research (Draper, Cargill & Cutts, 2002; Holmboe et al, 2004; Nicol & Macfarlane-Dick, 2006; Sadler, 1998) has shown the influence of formative assessment and feedback on learning, and has identified that the amount of feedback students actually receive when undertaking work placements is very small and can be of very poor quality (Black & Wiliam, 1998; Norcini & Burch, 2007).
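For readers unfamiliar with the form, the following minimal sketch illustrates the structure of a single mini-CEX record as described above: 1–9 rater scales for individual skills, an overall global competence rating and a free-text comments box. The Python class and field names are hypothetical and for illustration only; the study itself captured these records through the commercial mForms software rather than any custom code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, Optional

@dataclass
class MiniCEXRecord:
    """One completed mini-CEX assessment captured on the PDA (illustrative only)."""
    student_id: str
    assessment_type: str            # competence area assessed: communication, physical
                                    # examination, clinical reasoning or practical skills
    assessor_grade: str             # e.g. junior doctor, registrar/consultant, AHP
    completed_at: datetime
    item_scores: Dict[str, int]     # 1-9 rater scale per item, e.g. organisation,
                                    # professionalism, clinical reasoning
    global_rating: int              # overall global competence rating (1-9)
    comments: Optional[str] = None  # free-text comments and action planning

    def __post_init__(self) -> None:
        # Enforce the 1-9 rater scale described for the form.
        for item, score in self.item_scores.items():
            if not 1 <= score <= 9:
                raise ValueError(f"{item} score must be between 1 and 9")
        if not 1 <= self.global_rating <= 9:
            raise ValueError("global rating must be between 1 and 9")
```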
We felt that the recapitulatory nature of the placement was an excellent opportunity to use the mini-CEX assessment, as the extensive use of structured assessment over a 12-week period would provide students with considerable feedback on their performance, especially beneficial to underperforming students. The structured, formative nature of the assessment, combined with opportunistic learning opportunities captured by the PDA, would provide high quality and consistency of feedback, allowing students and staff to review student progress throughout the placement and shape the direction of their learning accordingly.

Methodology

We sought to explore attitudes and behaviours associated with the use of mobile technology in this delivery of formative assessment. Therefore, a grounded theory approach was chosen to elicit in-depth reflective responses from the students and assessors regarding their experiences. Grounded theory mirrors much of the inductive nature of qualitative research, developing concepts and themes that are 'grounded' in the data as the research process evolves (Glaser & Strauss, 1968). By utilising existing theory as a base, we were able to develop a project that would build on existing good practice and provide a deeper analysis (Strauss & Corbin, 1990).

Initially, four of the 13 students due to undertake the placement were invited to try using the mobile devices and the mini-CEX forms, and the outline of the project was explained. The students would be expected to conduct a minimum of eight mini-CEX assessments in four areas of competence (communication, physical examination, clinical reasoning and practical skills) over a 12-week remediation placement, with at least two assessments in each area of competence. All assessments would be formative in nature. The placement would be summatively assessed by case-based discussion (a presentation by the student of a particular patient's treatment, care planning, etc) as per routine practice. These four students reported that the project was workable, so a training session was arranged for the entire cohort of 13 students a week later. At this training session the students were issued with a T-mobile Vario 2 Mobile Digital Assistant device, loaded with Windows Mobile 5 and mForms (the software used to create the mini-CEX forms). It included an 'unlimited data connection with a fair use policy of 1 Gb per month'. The students were trained in how to use the devices and the mini-CEX forms, email access was provided and the project was outlined. Although some students were quite familiar with the concept of PDAs and would probably not have required much training to use the device (evidenced by several running ahead of the training), others were complete novices and needed considerable support. None of the students had encountered mForms previously, so a large proportion of the session was spent exploring how it worked and resolving initial problems. The initial face-to-face training was important to the success of the project, as we were aware that it would have been nearly impossible to train all the clinical staff the students might come into contact with. It was therefore important that students felt confident with the PDA, as they would need to explain its use to assessors.
However, whilst we expected some assessors to need assistance with device usage, we did expect the majority of the assessors to be familiar with mini-CEX assessment, and many to have had experience of using it either via the Internet or a work-based PC.

Students were told they would be expected to complete a feedback questionnaire at the end of the placement and to attend a focus group. The questionnaire developed by the research team was open-ended and themed into sections, such as usability, assessment feasibility and future embedding, derived from our reading of the current literature (see example questions in Figure A2 of the Appendix). This questionnaire was disseminated via email to students after completion of the summative assessments. Of the 13 students that completed the placement, 10 filled in the questionnaire and a further two did not fill in the questionnaire but attended the focus group. In addition to the questionnaire, the assessment data collated via the PDA were also subjected to analysis by two researchers through a process of reading and coding themes within the free text feedback section of the mini-CEX tool. Each student's individual data were studied for patterns in grading throughout the placement (see Figure A3 of the Appendix for an example). As a further method of triangulation, we held a focus group attended by seven students. Having conducted initial coding, two members of the research team constructed three prompts ('ice breakers', 'additional usage of the PDA' and 'any differences in feedback received') based on questionnaire responses to initiate discussion at the group. Discussion was held around each of these themes and the research team took notes of the discussion content, noting down particular quotes verbatim. By triangulating the data collated from the assessor and student questionnaires, the focus group and the assessment data, we were able to construct individual 'student learning journeys': that is, to view which assessments each student had conducted and when, and how their scores and feedback had developed throughout the placement.

Study limitations

This study is small scale and based on the views of 13 medical students. The student experience of using PDAs for assessment may vary depending on professional context and setting. The students involved are all recapitulatory students in need of additional support following failure at their final exams; therefore, the views expressed are not representative of the whole student population. A larger scale study with an entire cohort of students is needed to assess attitudes towards this method of assessment, as well as the feasibility of mobile assessment as a wider curriculum component in terms of infrastructure, technical support and staff training.

Findings

Across the questionnaire and focus group, a total of 12 of the 13 recapitulatory students gave their feedback. Seven were male and five female (age 22–26). Where appropriate, direct quotes from students have been used to illustrate the findings. Analysis of the PDA data revealed that students conducted 196 assessments in total (median 15, range 8–25). Assessments were conducted by 80 assessors (41 junior hospital doctors, 29 senior doctors [registrars/consultants] and 10 allied health professionals).
The median time for a mini-CEX assessment was 15 minutes for observation plus 8 minutes for feedback, in keeping with that seen in early postgraduate practice (Norcini & Burch, 2007). Feedback in text form was available in 67% of the assessments completed; in 33% of cases, no comments were added. The triangulated data were themed into the areas highlighted below.

Attitudes to the device in clinical areas

The research team were interested in the response of others (clinicians, other hospital staff and patients) to the use of PDAs in a clinical setting, given potential concerns about confidentiality and infection control. Three students reported encountering a 'mixed' response, with some colleagues very open to the technology and others resistant. The number of positive comments was surprising, and it became apparent that we had considerably overestimated resistance.

They really liked it. [It] Enabled us to speak to other members of the team—gave us a reason (Student 5).

The same student later commented that this was a key difference between this placement and previous placements:

We were able to have a reason to chat to the staff (Student 5).

It would appear that instead of creating barriers by using such a device in this setting, in this instance it actually broke some down. The PDA acted as a catalyst for cultural change (Wenger, 1999), and the resulting dialogue can be regarded as a significant advantage to students when moving into a new clinical placement and working with an unfamiliar team.

Attitudes to learning using a PDA by students

All of the students found the PDA comparatively easy to use, reflecting the findings of Torre et al (2007) and Kenny, Park, Van Neste-Kenny, Burton and Meiers (2009), though three commented that they found it a bit 'fiddly' to fill in. This comment was explored at the focus group and traced to the inputting of free text assessor comments. This reflects the findings of Kneebone and Brenton's (2005) study, where writing free text entries was abandoned midway through the project for reasons of practicality. Rekkedal and Dye (2009) solved this problem by providing students with keyboards in addition to the PDA. In our study, the difficulty of inputting free text comments could have been resolved by using the audio function of the device; for future studies this was suggested as the most practical way of collecting feedback. These issues demonstrate the need for a review of how mobile technology is currently used (Kukulska-Hulme & Pettit, 2009) to ensure appropriate design of learning material.

The research team were interested to know what other functions of the device the students used to benefit themselves and their learning. Nine students used the device for accessing their email, three to access their diary and seven to access the Internet. This mirrors the usage reported by PDA users in Kukulska-Hulme and Pettit's (2009) study. Four of the students commented that they would have accessed electronic resources such as the electronic British National Formulary and an e-version of clinical handbooks had these options been available, echoing the comments of nurses in Garrett and Klein's (2008) study, where access to reference materials was cited as a significant factor in the adoption of PDAs for use in the workplace. Nine out of the ten students who completed the questionnaire said they would find it helpful to have such a device when qualified.
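As an illustration of how the descriptive figures reported at the start of this Findings section (total assessments, median and range per student, proportion with free-text comments) and the 'student learning journeys' can be derived from assessment data exported from the PDAs, the short sketch below builds on the hypothetical MiniCEXRecord structure shown earlier. It is illustrative only and does not reproduce the analysis actually carried out by the research team.

```python
from collections import Counter
from statistics import median
from typing import Iterable, List, Tuple

def summarise(records: List[MiniCEXRecord]) -> dict:
    """Descriptive statistics of the kind reported in the Findings (illustrative)."""
    per_student = Counter(r.student_id for r in records)
    counts = list(per_student.values())
    with_comments = sum(1 for r in records if r.comments)
    return {
        "total_assessments": len(records),
        "median_per_student": median(counts),
        "range_per_student": (min(counts), max(counts)),
        "percent_with_comments": round(100 * with_comments / len(records)),
    }

def learning_journey(records: Iterable[MiniCEXRecord],
                     student_id: str) -> List[Tuple]:
    """One student's global ratings in date order: a simple 'learning journey'."""
    own = sorted((r for r in records if r.student_id == student_id),
                 key=lambda r: r.completed_at)
    return [(r.completed_at.date(), r.assessment_type, r.global_rating)
            for r in own]
```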
Attitudes to assessment and assessor behaviour by students

Two students did suggest that if the assessors had their own login, assessments could be completed later at the clinician's convenience. However, this option would have allowed assessments to be left unfinished, and the chances of obtaining feedback would have decreased considerably. Additionally, the timely delivery of feedback is seen as a critical factor in student assimilation of learning points (Kneebone et al, 2008). Another student commented,

the actual process of completing the assessments did help in terms of defining goals for us personally as well as giving our assessors an idea of the goals we were working towards so they knew how to feedback. I think it is a good idea to keep the assessments simple as they are in current format (Student 10).

Three students noticed that the junior doctors seemed more open to, or familiar with, the devices. This theme was strengthened when all students were asked whom they tended to ask to fill in assessments: over half specified junior doctors. Whilst the students were commenting on the junior doctors' familiarity with the devices, it is also possible that these junior doctors were more familiar with the mini-CEX itself, as many of them will have used these assessments in postgraduate education. All the students had to demonstrate how to use the device to their assessors. When asked how they felt about doing this, seven were happy to do so:

I had no problems doing this ... especially if it provided a means of obtaining useful feedback from them. And usually they did not require much prompting as it was fairly straight forward to use (Student 7).

Another added, 'it was good for once teaching them something!' (Student 3). However, the effect of this practice was questioned:

[I] feel because I was needed to fill it in, it may have altered their feedback (Student 8).

Confidence versus digital nativity

All the students felt that the initial training was necessary and useful, and two felt reassured by having a helpline number to call. This positive response to training, and the attitude of the students observed in this pilot and in our current larger study, indicates that in this case many students did not feel automatically comfortable with the technology, mirroring the experience of both Kneebone and Brenton (2005) and Garrett and Klein (2008), in which user support was seen as key by both participants and researchers. These findings would seem to challenge the idea of all young people being 'digital natives', and the need to utilise technology enhanced learning (TEL) for engagement. The use of the PDA was intended to provide the instant feedback, and therefore the gratification, supposedly sought by this generation of students (Prensky, 2001), in a format they would feel comfortable using. However, the lack of confidence observed in these studies indicates that caution is needed when using TEL to ensure that digital competence is not assumed.

Overall impact

Students were finally asked to compare this placement with their previous placements, and whilst two did not seem to be aware of a specific difference, over half found the placement more 'focused' and goal orientated.
I think that with the device I felt much more focused towards particular goals and it was also a good way for me to keep an eye on how I was doing (Student 2).

Analysis of the actual assessment scores and feedback allows us to chart a journey of incremental improvement for each student as they progressed through the placement, echoing the students' own feelings of progression (Figure A3). These learning journeys are a powerful pedagogical tool for faculty and students alike. The recognisable incremental improvement observed throughout is a convincing justification for the inclusion of formative assessment as part of a curriculum assessment strategy, and it provided the students with an improved feeling of confidence. Every student felt the formative assessments helped them to prepare for the summative presentation and final examinations, and all 13 students passed the final exam on the second attempt; however, larger scale research is needed to demonstrate the effect of formative assessment on summative exam performance.

Discussion

Our study demonstrates a measurable improvement in student performance through the use of structured formative assessment, echoing the results seen in Kogan et al's (2002) study using the mini-CEX with undergraduates. Our students reported a higher level of feedback on assessment during this placement, supporting the claim that feedback during placements can often be minimal (Black & Wiliam, 1998; Norcini & Burch, 2007). This reiterates the importance of formative assessment for learning (Draper et al, 2002; Sadler, 1998) as part of an overall curriculum strategy.

The use of the PDA for assessment delivery provided some interesting and unexpected findings. Within this study, students found that the technology acted as an ice breaker, encouraging engagement with non-medical ward staff whom the students had historically not encountered. In this way, the PDA as an object became a representation of the experience taking place, a process of reification. The introduction of a new device for learning into the community of practice facilitated an opportunity for the staff and students to open a dialogue and further develop shared meaning and experience (Wenger, 1999). This phenomenon could be a considerable benefit for future cohorts, increasing wider inter-professional working and learning. Unfortunately, there is a possibility that the new-found dialogue could be short lived once the 'novelty factor' arising from unfamiliarity with these devices wears off; therefore, consolidation work must be undertaken to prevent this from happening.

In terms of the assessment process itself, one student suggested approaching patients for feedback via the PDA in addition to teacher/assessors. Whilst certain health and social care professions (such as social work) actively involve patients in teaching and curriculum design, using patients for assessment in the workplace is rare. Involving patients in assessment is an ongoing stream of work within the Assessment & Learning in Practice Settings programme, and is increasingly apparent in postgraduate medical education.

Although previous work has identified that training is key to implementing mobile projects (Dearnley et al, 2007; Garrett & Klein, 2008; Kneebone & Brenton, 2005), perhaps the most surprising finding was the reluctance and anxiety on the part of some of the students towards using the device.
These students were typically aged 23–24 years and were a very heterogeneous group in terms of their IT 'nativity'. Although this work naturally prevents generalisation, it would appear that whilst age and computer literacy are clearly linked, age is not the only variable that governs our response to new technologies; personality traits and maturity in learning are likely to be equally important. It is also possible that students do not know how to use technology for learning purposes (Ramanau et al, 2008).

Conclusion

The use of the PDA-based mini-CEX with final year recapitulatory students provided a key opportunity to undertake a large number of assessments within an intensive revision period and provided students with considerably more opportunity for feedback on their skills. Although some found the number of assessments to be completed hard work, all of the students found the placement beneficial and enjoyable, and one commented that it was the best placement they had been on:

I have really enjoyed this placement. As compared to other placements I think I had a clearer sense of my goals and what I wanted to achieve. I really felt supported by people on the team and felt learning opportunities were made easily available (Student 9).

The use of the PDA encourages immediate feedback, which the students in our study found really useful; however, there is ongoing debate regarding the optimum time to provide students with feedback (Kneebone et al, 2008; McOwen, Kogan & Shea, 2008). The structure provided by the mini-CEX assessments seemed to encourage them to be more 'goal orientated'. Additionally, the ownership the students had of their device was echoed in their learning: the students were much more aware of the self-directed nature of this learning experience and the opportunity for personalised learning. This theme is present in other m-learning research (Hartnell-Young & Vetere, 2008; Sharples, Taylor & Vavoula, 2005) and is a significant advantage of the use of mobile technology for learning. A high level of structure and direction is beneficial during a recapitulatory placement, and the use of the mini-CEX and PDA provided both.

The implications of the study provide an initial proof of concept for those considering the use of mobile devices in work-based assessment or for providing students with a link to the University when on work placement. If mobile learning can be implemented within a clinical or workplace setting, then there is scope for implementation of mobile learning and assessment across a wide variety of educational situations. With the development of specialist academies (DCSF, 2007), set up to provide young people with the skills employers are seeking, the future of work-based learning is on the ascent. Increasingly, many more university programmes include work placements to enhance courses, engaging with employers and increasing their graduates' employability. These placements provide students with unique opportunities, but can also leave them isolated. In addition, universities can find it hard to quality assure the learning that takes place within these work placements. Mobile technology and assessment offers one solution to this, with further research needed into cost effectiveness, quality assurance and IT infrastructure.
Further research

The majority of current educational m-learning research is still small scale (Ally, 2009; Kukulska-Hulme & Traxler, 2005) and this is reflected in our own health-centric study. A large scale study is needed to establish the feasibility of mobile learning as a sustainable curriculum component. Our current work concentrates on the use of the PDA-based mini-CEX with the whole fifth year during their final year placements. Further areas for research to better our understanding of 'what' and 'how' technology can improve learning and assessment might include longitudinal studies examining student attitudes and attainment, behaviours and changes in learning styles. In scoping forward, it is important to acknowledge that the use of PDAs in researching teaching and learning gives us access to placement learning information we have not previously had, opening a new opportunity for research.

Acknowledgements

Acknowledgements go to Shervanthi Homer-Vanniasinkam, University of Leeds, for development of the mini-CEX form; Gareth Frith, University of Leeds, for procurement of devices; and Trudie Roberts, University of Leeds, for concept development.

References

Ally, M. (2009). Mobile learning: transforming the delivery of education and training. Alabama, USA: AU Press.
Axelson, C., Wardh, I., Strender, L. E. & Nilsson, G. (2007). Using medical knowledge studies on handheld computers—a qualitative study among junior doctors. Medical Teacher, 29, 6, 611–618.
Black, P. J. & Wiliam, D. (1998). Inside the black box: raising standards through classroom assessment. London: Kings College London School of Education.
Collins, B. (1996). Tele-learning in a digital world: the future of distance learning. London: International Thompson Computer Press.
Dearnley, C., Haigh, J. & Fairhall, J. (2007). Using mobile technologies for assessment and learning in practice settings: a case study. Nurse Education in Practice, 8, 3, 197–204.
Department for Children, Schools and Families (2007). Building on the best: final report and implementation plan of the review of the 14-19 work related learning. Retrieved February 10, 2009, from http://www.dcsf.gov.uk/14-19/documents/14-19workrelatedlearning_web.pdf
Draper, S. W., Cargill, J. & Cutts, Q. (2002). Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18, 1, 13–23.
Durning, S. J., Cation, L. J., Markert, R. J. & Pangaro, L. N. (2002). Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training. Academic Medicine, 77, 900–904.
Finlay, K., Norman, G. R., Stolberg, H., Weaver, B. & Keane, D. R. (2006). In-training evaluation using handheld computerized clinical work sampling strategies in radiology residency. Journal of Canadian Association of Radiology, 57, 232–237.
Fisher, M. & Baird, D. E. (2006). Making mLearning work: utilizing mobile technology for active exploration, collaboration, assessment, and reflection in higher education. Journal of Education Technology Systems, 35, 1, 3–30.
Frohberg, D. (2002). Communities - the MOBIlearn perspective. Workshop on ubiquitous and mobile computing for educational communities: enriching and enlarging community spaces, international conference on communities and technologies, Amsterdam, 19 September 2003. Retrieved February 10, 2009, from http://www.idi.ntnu.no/~divitini/umocec2003/Final/frohberg.pdf
Garrett, B. & Klein, G. (2008). Value of wireless personal digital assistants for practice: perceptions of advanced practice nurses. Journal of Clinical Nursing, 17, 16, 2146–2154.
Garrett, B. M. & Jackson, C. (2006). A mobile clinical e-portfolio for nursing and medical students, using wireless personal digital assistants. Nurse Education in Practice, 6, 339–346.
Glaser, B. & Strauss, A. L. (1968). Time for dying. Chicago: Aldine.
Hartnell-Young, E. & Vetere, F. (2008). A means of personalising learning: incorporating old and new literacies in the curriculum with mobile phones. Curriculum Journal, 19, 4, 283–292.
Hauer, K. E. (2000). Enhancing feedback to students using the mini-CEX (clinical evaluation exercise). Academic Medicine, 75, 534.
Holmboe, E. S., Fiebach, N. H., Galatay, L. A. & Huot, S. (2001). Effectiveness of a focused educational intervention on resident evaluations from faculty: a randomized controlled trial. Journal of General Internal Medicine, 16, 427–434.
Holmboe, E. S., Williams, F., Yepes, M., Norcini, J. J., Blank, L. L. & Huot, S. J. (2004). The mini clinical examination exercise and interactive feedback: preliminary results. Journal of General Internal Medicine, 16, 1, 100.
Kenny, R. F., Park, C., Van Neste-Kenny, J., Burton, P. A. & Meiers, J. (2009). Using mobile learning to enhance the quality of nursing practice education. In M. Ally (Ed.), Mobile learning: transforming the delivery of education and training (pp. 51–75). Alabama, USA: AU Press.
Kneebone, R. & Brenton, H. (2005). Training perioperative specialist practitioners. In A. Kukulska-Hulme & J. Traxler (Eds), Mobile learning: a handbook for educators and trainers (pp. 106–115). London: Routledge.
Kneebone, R., Bello, F., Nestel, D., Mooney, N., Codling, A., Yadollahi, F. et al (2008). Learner-centered feedback using remote assessment of clinical procedures. Medical Teacher, 30, 8, 795–801.
Kogan, J. R., Bellin, L. M. & Shea, J. A. (2002). Implementation of the mini-CEX to evaluate medical students' clinical skills. Academic Medicine, 77, 1156–1157.
Koskimaa, R., Lehtonen, M., Heinonen, U., Ruokama, H., Tisarri, V., Vahtivuori-Hänninen, S. et al (2007). A cultural approach to networked-based mobile education. International Journal of Educational Research, 46, 3–4, 204–214.
Kramer, M. (2009). Deciphering the future of learning through daily observation. Paper presentation, 3rd WLE Mobile Learning Symposium: Mobile Learning Cultures across Education, Work and Leisure, 27 March, WLE Centre, IOE London, UK.
Kukulska-Hulme, A. & Pettit, A. J. (2009). Using mobile learning to enhance the quality of nursing practice education. In M. Ally (Ed.), Mobile learning: transforming the delivery of education and training (pp. 51–75). Alabama: AU Press.
Kukulska-Hulme, A. & Traxler, J. (2005). Mobile learning: a handbook for educators and trainers. London: Routledge.
McGuire, L. (2005). Assessment using new technology. Innovations in Education and Teaching International, 42, 3, 265–276.
McOwen, K. S., Kogan, J. R. & Shea, J. A. (2008). Elapsed time between teaching and evaluation: does it matter? Academic Medicine, 83, 10, S29–32.
Naismith, L., Lonsdale, P., Vavoula, G. & Sharples, M. (2004). Report 11: literature review in mobile technologies and learning. A report for NESTA Futurelab. University of Birmingham. Retrieved February 3, 2009, from http://futurelab.org.uk/research/reviews/reviews_11_and12/11_02.htm
Nicol, D. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31, 2, 199–218.
Norcini, J. & Burch, V. (2007). Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher, 29, 9, 855–871.
Norcini, J. J., Blank, L. L., Duffy, F. D. & Fortna, G. (2003). The mini CEX: a method for assessing clinical skills. Annals of Internal Medicine, 138, 476–481.
Preece, J. (2000). Online communities: designing usability, supporting sociability. Chichester: Wiley.
Prensky, M. (2001a). Digital natives, digital immigrants. On the Horizon, 9, 5, 1–6. NCB University Press.
Prensky, M. (2001b). Digital natives, digital immigrants, Part II: do they really think differently? On the Horizon, 9, 6, 1–7. NCB University Press.
Ramanau, R., Sharpe, R. & Benfield, G. (2008). Exploring patterns of student learning technology use in their relationship to self-regulation and perceptions of learning community. Sixth International Conference on Networked Learning, Denmark, 5–6 May 2008. Retrieved February 10, 2009, from http://www.networkedlearningconference.org.uk/abstracts/PDFs/Ramanau_334-341.pdf
Sadler, R. (1998). Formative assessment: revisiting the territory. Assessment in Education: Principles, Policy & Practice, 5, 1, 77–84.
Sharples, M., Taylor, J. & Vavoula, G. (2005). Towards a theory of mobile learning. Proceedings of mLearn 2005 Conference, Cape Town. Retrieved August 5, 2008, from http://www.lsri.nottingham.ac.uk/msh/Papers/Towards%20a%20theory%20of%20mobile%20learning.pdf
Shim, S. J. & Viswanathan, V. (2007). User assessment of personal digital assistants used in pharmaceutical detailing: system features, usefulness and ease of use. Journal of Computer Information Systems, 48, 1, 14–21.
Strauss, A. & Corbin, J. (1990). Basics of qualitative research: grounded theory procedures and techniques. California: Sage.
Tapscott, D. (1998). Growing up digital: the rise of the net generation. New York: McGraw-Hill Companies.
Torre, D. M., Simpson, D. E., Elnicki, D. M., Sebastian, J. L. & Holmboe, E. S. (2007). Feasibility, reliability and user satisfaction with a PDA-based mini-CEX to evaluate the clinical skills of third-year medical students. Teaching and Learning in Medicine, 19, 3, 271–277.
Triantafillou, E., Georgiadou, E. & Economides, A. A. (2008). The design and evaluation of a computerized adaptive test on mobile devices. Computers & Education, 50, 4, 1319–1330.
Vavoula, G., Pachler, N. & Kukulska-Hulme, A. (2009). Researching mobile learning: frameworks, tools and research designs. Oxford, UK: Peter Lang.
Wenger, E. (1999). Communities of practice: learning, meaning, and identity. Cambridge, UK: Cambridge University Press.